
    External inverse pattern matching

    Get PDF
    We consider the external inverse pattern matching problem. Given a text t of length n over an ordered alphabet Σ with |Σ| = σ, and a number m ≤ n, the problem is to find a pattern p ∈ Σ^m which is not a subword of t and which maximizes the sum of Hamming distances between p and all subwords of t of length m. We present an optimal O(n log σ)-time algorithm for the external inverse pattern matching problem, which substantially improves the only known polynomial O(nm log σ)-time algorithm, introduced by Amir, Apostolico and Lewenstein. Moreover, we discuss a fast parallel implementation of our algorithm on the CREW PRAM model.
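    To make the objective concrete, the following is a minimal brute-force sketch in Python of the problem statement above. It is only a reference baseline running in O(σ^m · n · m) time, not the optimal O(n log σ) algorithm of the paper, and the function names are our own.

        from itertools import product

        def sum_hamming(p, t):
            """Sum of Hamming distances between pattern p and all length-|p| subwords of t."""
            m = len(p)
            return sum(sum(a != b for a, b in zip(p, t[i:i + m]))
                       for i in range(len(t) - m + 1))

        def external_inverse_brute_force(t, m, alphabet):
            """Exhaustive baseline: best pattern of length m over the alphabet that is
            NOT a subword of t and maximizes sum_hamming; returns (pattern, score),
            or None if every length-m string over the alphabet occurs in t."""
            subwords = {t[i:i + m] for i in range(len(t) - m + 1)}
            best = None
            for cand in product(alphabet, repeat=m):
                p = ''.join(cand)
                if p in subwords:
                    continue  # the pattern must be "external", i.e. not occur in t
                score = sum_hamming(p, t)
                if best is None or score > best[1]:
                    best = (p, score)
            return best

        # tiny usage example over the binary alphabet
        print(external_inverse_brute_force("abbab", 2, "ab"))  # ('aa', 5)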

    The Power of Verification for Greedy Mechanism Design

    Get PDF
    Greedy algorithms are known to provide near optimal approximation guarantees for Combinatorial Auctions (CAs) with multidimensional bidders, ignoring incentive compatibility. Borodin and Lucier [5], however, proved that truthful greedy-like mechanisms for CAs with multi-minded bidders do not achieve good approximation guarantees. In this work, we seek a deeper understanding of greedy mechanism design and investigate under which general assumptions we can have efficient and truthful greedy mechanisms for CAs. Towards this goal, we use the framework of priority algorithms and weak and strong verification, where the bidders are not allowed to overbid on their winning set or on any subset of this set, respectively. We provide a complete characterization of the power of weak verification, showing that it is sufficient and necessary for any greedy fixed priority algorithm to become truthful, with or without money depending on the ordering of the bids. Moreover, we show that strong verification is sufficient and necessary for the greedy algorithm of [20], which is 2-approximate for submodular CAs, to become truthful with money in finite bidding domains. Our proof is based on an interesting structural analysis of the strongly connected components of the declaration graph.

    The Power of Verification for Greedy Mechanism Design

    Get PDF
    Greedy algorithms are known to provide, in polynomial time, near optimal approximation guarantees for Combinatorial Auctions (CAs) with multidimensional bidders. It is known that truthful greedy-like mechanisms for CAs with multi-minded bidders do not achieve good approximation guarantees. In this work, we seek a deeper understanding of greedy mechanism design and investigate under which general assumptions we can have efficient and truthful greedy mechanisms for CAs. Towards this goal, we use the framework of priority algorithms and weak and strong verification, where the bidders are not allowed to overbid on their winning set or on any subset of this set, respectively. We provide a complete characterization of the power of weak verification, showing that it is sufficient and necessary for any greedy fixed priority algorithm to become truthful, with or without money depending on the ordering of the bids. Moreover, we show that strong verification is sufficient and necessary to obtain a 2-approximate truthful mechanism with money, based on a known greedy algorithm, for the problem of submodular CAs in finite bidding domains. Our proof is based on an interesting structural analysis of the strongly connected components of the declaration graph.
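    As a rough illustration of the kind of greedy fixed priority algorithm analyzed in the two entries above, the Python sketch below allocates disjoint bundles to multi-minded bidders in a fixed priority order. The priority function, data layout and names are illustrative assumptions of ours; the verification rules and payments studied in the papers are not reproduced here.

        def greedy_priority_allocation(bids, priority=lambda b: b[2]):
            """Fixed-priority greedy for multi-minded CAs.  Each bid is a triple
            (bidder, frozenset_of_items, declared_value).  Bids are scanned in
            non-increasing priority order; a bid wins iff its items are still
            free and its bidder has not won yet."""
            allocated_items, winners, allocation = set(), set(), {}
            for bidder, items, value in sorted(bids, key=priority, reverse=True):
                if bidder in winners or items & allocated_items:
                    continue
                allocation[bidder] = (items, value)
                winners.add(bidder)
                allocated_items |= items
            return allocation

        # usage: two multi-minded bidders, priority = declared value
        bids = [(1, frozenset({'a', 'b'}), 5),
                (1, frozenset({'c'}), 2),
                (2, frozenset({'b'}), 4)]
        print(greedy_priority_allocation(bids))  # {1: (frozenset({'a', 'b'}), 5)}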

    Local Fault-tolerant Quantum Computation

    Full text link
    We analyze and study the effects of locality on the fault-tolerance threshold for quantum computation. We analytically estimate how the threshold will depend on a scale parameter r which estimates the scale-up in the size of the circuit due to encoding. We carry out a detailed semi-numerical threshold analysis for concatenated coding using the 7-qubit CSS code in the local and 'nonlocal' settings. First, we find that the threshold in the local model for the [[7,1,3]] code has a 1/r dependence, which is in correspondence with our analytical estimate. Second, the threshold, beyond the 1/r dependence, does not depend too strongly on the noise levels for transporting qubits. Beyond these results, we find that it is important to look at more than one level of concatenation in order to estimate the threshold, and that it may be beneficial in certain places, like in the transportation of qubits, to do error correction only infrequently.

    Resource Competition on Integral Polymatroids

    Full text link
    We study competitive resource allocation problems in which players distribute their demands integrally on a set of resources subject to player-specific submodular capacity constraints. Each player has to pay, for each unit of demand, a cost that is a nondecreasing and convex function of the total allocation of that resource. This general model of resource allocation generalizes both singleton congestion games with integer-splittable demands and matroid congestion games with player-specific costs. As our main result, we show that in such general resource allocation problems a pure Nash equilibrium is guaranteed to exist, by giving a pseudo-polynomial algorithm that computes one.
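    The cost model above can be made concrete with a short Python sketch. The function below evaluates one player's cost under an integral allocation profile, assuming, as stated in the abstract, that each unit of demand placed on a resource pays a nondecreasing convex function of that resource's total allocation. The polymatroid capacity constraints and the paper's equilibrium-computation algorithm are not modeled, and all names are illustrative.

        def player_cost(allocation, player, cost_fns):
            """Cost of `player` under an integral allocation profile.
            allocation[p][r] = units of p's demand placed on resource r
            cost_fns[r](x)   = per-unit cost on r when its total load is x
                               (assumed nondecreasing and convex)
            Each of the player's units on r pays cost_fns[r](total load on r)."""
            total = {r: sum(a.get(r, 0) for a in allocation.values())
                     for r in cost_fns}
            return sum(units * cost_fns[r](total[r])
                       for r, units in allocation[player].items())

        # usage: two resources with convex per-unit costs c(x) = x and c(x) = x**2
        cost_fns = {'r1': lambda x: x, 'r2': lambda x: x * x}
        allocation = {'p1': {'r1': 2, 'r2': 1},   # player 1 splits 3 units
                      'p2': {'r1': 1}}            # player 2 places 1 unit
        print(player_cost(allocation, 'p1', cost_fns))  # 2*3 + 1*1 = 7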

    Combinatorial Auctions without Money

    Get PDF
    Algorithmic Mechanism Design attempts to marry computation and incentives, mainly by leveraging monetary transfers between the designer and the selfish agents involved. This is principally because, in the absence of money, very little can be done to enforce truthfulness. However, in certain applications, money is unavailable, morally unacceptable, or simply at odds with the objective of the mechanism. For example, in Combinatorial Auctions (CAs), the paradigmatic problem of the area, we aim at solutions of maximum social welfare, but still charge the society to ensure truthfulness. We focus on the design of incentive-compatible CAs without money in the general setting of k-minded bidders. We trade monetary transfers for the observation that the mechanism can detect certain lies of the bidders: i.e., we study truthful CAs with verification and without money. In this setting, we characterize the class of truthful mechanisms and give a host of upper and lower bounds on the approximation ratio obtained by either deterministic or randomized truthful mechanisms. Our results provide an almost complete picture of truthfully approximating CAs in this general setting with multi-dimensional bidders.
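    A minimal sketch of the verification idea mentioned above, under our own reading that "detecting certain lies" means a bidder is caught whenever the value declared for the awarded bundle exceeds the true value for it; the paper's exact verification model may be finer-grained, and the names below are hypothetical.

        def caught_overbidding(declared, true_val, awarded):
            """Verification-without-money sketch: a bidder is caught iff the value
            declared for the awarded bundle exceeds the true value for that bundle.
            `declared` and `true_val` map frozensets of items to numbers."""
            return declared(awarded) > true_val(awarded)

        # usage: a bidder declares 10 for {a, b} but truly values the bundle at 7
        declared = lambda S: 10 if S == frozenset({'a', 'b'}) else 0
        true_val = lambda S: 7 if S == frozenset({'a', 'b'}) else 0
        print(caught_overbidding(declared, true_val, frozenset({'a', 'b'})))  # True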

    Mechanism Design for Perturbation Stable Combinatorial Auctions

    Full text link
    Motivated by recent research on combinatorial markets with endowed valuations by Babaioff et al. (EC 2018) and Ezra et al. (EC 2020), we introduce a notion of perturbation stability in Combinatorial Auctions (CAs) and study the extent to which stability helps in social welfare maximization and mechanism design. A CA is γ-stable if the optimal solution is resilient to inflation, by a factor of γ ≥ 1, of any bidder's valuation for any single item. On the positive side, we show how to compute efficiently an optimal allocation for 2-stable subadditive valuations and that a Walrasian equilibrium exists for 2-stable submodular valuations. Moreover, we show that a Parallel 2nd Price Auction (P2A) followed by a demand query for each bidder is truthful for general subadditive valuations and results in the optimal allocation for 2-stable submodular valuations. To highlight the challenges behind optimization and mechanism design for stable CAs, we show that a Walrasian equilibrium may not exist for γ-stable XOS valuations for any γ, that a polynomial-time approximation scheme does not exist for (2−ϵ)-stable submodular valuations, and that any DSIC mechanism that computes the optimal allocation for stable CAs and does not use demand queries must use exponentially many value queries. We conclude with analyzing the Price of Anarchy of P2A and Parallel 1st Price Auctions (P1A) for CAs with stable submodular and XOS valuations. Our results indicate that the quality of equilibria of simple non-truthful auctions improves only for γ-stable instances with γ ≥ 3.
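    A brute-force Python sketch of the stability notion defined above, for tiny instances only. We assume here that inflating a bidder's valuation for a single item by γ means adding (γ − 1) times that bidder's value for the item alone to every bundle containing it; this inflation rule, the tie-breaking, and all names are our own reading rather than the paper's exact definition.

        from itertools import product

        def optimal_allocation(items, valuations):
            """Brute-force welfare-maximizing allocation for a tiny CA instance.
            valuations[b] maps a frozenset of items to bidder b's value for it."""
            bidders = list(valuations)
            best = None
            for owners in product(range(len(bidders)), repeat=len(items)):
                bundles = {b: frozenset(it for it, o in zip(items, owners) if o == j)
                           for j, b in enumerate(bidders)}
                welfare = sum(valuations[b](bundles[b]) for b in bidders)
                if best is None or welfare > best[0]:
                    best = (welfare, bundles)
            return best[1]

        def is_gamma_stable(items, valuations, gamma):
            """True iff the optimal allocation survives inflating, by a factor gamma,
            any single bidder's valuation for any single item (assumed rule: add
            (gamma - 1) * v({item}) to every bundle containing that item).
            Ties are broken by enumeration order, so the check is conservative."""
            base = optimal_allocation(items, valuations)
            for b in valuations:
                for it in items:
                    def inflated(S, b=b, it=it):
                        bump = (gamma - 1) * valuations[b](frozenset({it})) if it in S else 0
                        return valuations[b](S) + bump
                    perturbed = {**valuations, b: inflated}
                    if optimal_allocation(items, perturbed) != base:
                        return False
            return True

        # usage: two bidders with additive values over two items
        v = {'b1': lambda S: 3 * ('x' in S) + 1 * ('y' in S),
             'b2': lambda S: 1 * ('x' in S) + 3 * ('y' in S)}
        print(is_gamma_stable(['x', 'y'], v, gamma=2))  # True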

    Resistance Exercise Reverses Aging in Human Skeletal Muscle

    Get PDF
    Human aging is associated with skeletal muscle atrophy and functional impairment (sarcopenia). Multiple lines of evidence suggest that mitochondrial dysfunction is a major contributor to sarcopenia. We evaluated whether healthy aging was associated with a transcriptional profile reflecting mitochondrial impairment and whether resistance exercise could reverse this signature to that approximating a younger physiological age. Skeletal muscle biopsies from healthy older (N = 25) and younger (N = 26) adult men and women were compared using gene expression profiling, and a subset of these were related to measurements of muscle strength. Fourteen of the older adults had muscle samples taken before and after a six-month resistance exercise-training program. Before exercise training, older adults were 59% weaker than younger adults, but after six months of training their strength improved significantly (P<0.001), such that they were only 38% weaker than young adults. As a consequence of age, we found 596 genes differentially expressed using a false discovery rate cut-off of 5%. Prior to the exercise training, the transcriptome profile showed a dramatic enrichment of genes associated with mitochondrial function with age. However, following exercise training the transcriptional signature of aging was markedly reversed back toward younger levels for most genes that were affected by both age and exercise. We conclude that healthy older adults show evidence of mitochondrial impairment and muscle weakness, but that this can be partially reversed at the phenotypic level, and substantially reversed at the transcriptome level, following six months of resistance exercise training.